Protein Subcellular Localization with Gaussian Kernel Discriminant Analysis and Its Kernel Parameter Selection
Authors
Abstract
Kernel discriminant analysis (KDA) is a dimension-reduction and classification algorithm based on the nonlinear kernel trick, and it can be used to preprocess high-dimensional, complex biological data before classification tasks such as protein subcellular localization. Kernel parameters have a strong impact on the performance of the KDA model. In particular, for KDA with the popular Gaussian kernel, selecting the scale parameter remains a challenging problem. This paper therefore introduces the KDA method and proposes a new approach to Gaussian kernel parameter selection, based on the observation that, for a suitable kernel parameter, the difference between the reconstruction errors of edge normal samples and those of interior normal samples should be maximized. Experiments on several standard protein subcellular localization data sets show that the overall prediction accuracy with KDA is much higher than without it. Moreover, the kernel parameter of KDA strongly affects efficiency, and the proposed method produces an optimal parameter, so that the new algorithm performs as effectively as traditional approaches while reducing computational time and thus improving efficiency.
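The abstract describes the approach only at a high level. As a rough illustration, the Python sketch below implements a multiclass Gaussian-kernel Fisher discriminant (one common realization of KDA) followed by nearest-centroid classification in the reduced space. All names here (gaussian_kernel, fit_kda, predict_kda) and the regularization constant are illustrative assumptions, not the authors' code, and the paper's edge-versus-interior reconstruction-error criterion for choosing the scale parameter sigma is not reproduced, since the abstract does not give its details.

```python
# Minimal Gaussian-kernel discriminant analysis (KDA) sketch.
# Assumptions (not from the paper): a feature matrix X of protein
# descriptor vectors, integer class labels y (subcellular locations),
# and a fixed scale parameter sigma.
import numpy as np

def gaussian_kernel(A, B, sigma):
    """K[i, j] = exp(-||A_i - B_j||^2 / (2 * sigma^2))."""
    sq = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-sq / (2.0 * sigma**2))

def fit_kda(X, y, sigma, reg=1e-6):
    """Fit kernel Fisher discriminant directions and class centroids."""
    n = X.shape[0]
    classes = np.unique(y)
    K = gaussian_kernel(X, X, sigma)             # n x n kernel matrix
    m_all = K.mean(axis=1, keepdims=True)        # overall kernel mean
    M = np.zeros((n, n))                         # between-class scatter
    N = reg * np.eye(n)                          # within-class scatter (regularized)
    for c in classes:
        idx = np.where(y == c)[0]
        Kc = K[:, idx]                           # n x n_c block for class c
        m_c = Kc.mean(axis=1, keepdims=True)
        M += len(idx) * (m_c - m_all) @ (m_c - m_all).T
        H = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
        N += Kc @ H @ Kc.T
    # Leading eigenvectors of N^{-1} M give the expansion coefficients
    # alpha of the discriminant directions over the training points.
    evals, evecs = np.linalg.eig(np.linalg.solve(N, M))
    order = np.argsort(-evals.real)[: len(classes) - 1]
    alpha = evecs[:, order].real                 # n x (C-1) coefficients
    Z = K @ alpha                                # projected training data
    centroids = {c: Z[y == c].mean(axis=0) for c in classes}
    return X, alpha, centroids, sigma

def predict_kda(model, X_new):
    """Nearest-centroid classification in the KDA-reduced space."""
    X_train, alpha, centroids, sigma = model
    Z_new = gaussian_kernel(X_new, X_train, sigma) @ alpha
    labels = list(centroids)
    dists = np.stack([np.linalg.norm(Z_new - centroids[c], axis=1)
                      for c in labels], axis=1)
    return np.array(labels)[np.argmin(dists, axis=1)]
```

In practice, sigma would be chosen by sweeping a grid of candidate values and keeping the one that maximizes the paper's reconstruction-error gap between edge and interior samples (or, lacking those details, accuracy on held-out data).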
Similar Articles
Fisher’s Linear Discriminant Analysis for Weather Data by reproducing kernel Hilbert spaces framework
With recent advances in science and technology, data of a functional nature have become easy to collect. Hence, statistical analysis of such data is of great importance. As in multivariate analysis, linear combinations of random variables play a key role in functional data analysis. The theory of reproducing kernel Hilbert spaces is very important in this context. In this paper we study a gen...
Discriminant Kernel Learning via Convex Programming
Regularized kernel discriminant analysis (RKDA) performs linear discriminant analysis in the feature space via the kernel trick. Its performance depends on the selection of kernels. We show that this kernel learning problem can be formulated as a semidefinite program (SDP). Based on the equivalence relationship between RKDA and least-squares problems in the binary-class case, we propose an effic...
Kernel Fisher Discriminant Analysis in Gaussian Reproducing Kernel Hilbert Spaces – Theory
Kernel Fisher discriminant analysis (KFDA) has been proposed for nonlinear binary classification. It is a hybrid method of the classical Fisher linear discriminant analysis and a kernel machine. Experimental results have shown that the KFDA performs slightly better in terms of prediction error than the popular support vector machines and is a strong competitor to the latter. However, there is v...
ISAR Image Improvement Using STFT Kernel Width Optimization Based On Minimum Entropy Criterion
Nowadays, radar systems have many applications, and radar imaging is one of the most important of them. Inverse synthetic aperture radar (ISAR) is used to form an image of moving targets. Conventional methods use the Fourier transform to retrieve Doppler information. However, because of target maneuvering, the Doppler spectrum becomes time-varying and the image is blurred. Joi...
Adaptive Quasiconformal Kernel Fisher Discriminant Analysis via Weighted Maximum Margin Criterion
Kernel Fisher discriminant analysis (KFD) is an effective method for extracting nonlinear discriminant features from input data using the kernel trick. However, conventional KFD algorithms suffer from the kernel selection problem as well as the singularity problem. In order to overcome these limitations, a novel nonlinear feature extraction method called adaptive quasiconformal kernel Fisher discriminant ana...